Deep Nearest Class Mean Classifiers

Authors

  • Samantha Guerriero
  • Thomas Mensink
Abstract

In this paper we introduce DeepNCM, a Nearest Class Mean classification method enhanced to directly learn highly non-linear deep (visual) representations of the data. To overcome the computationally expensive process of recomputing the class means after every update of the representation, we opt for approximating the class means with an online estimate. Moreover, to allow the class means to closely follow the drifting representation, we introduce per-epoch mean condensation. Using online class means with condensation, DeepNCM can train efficiently on large datasets. Our (preliminary) experimental results indicate that DeepNCM performs on par with SoftMax-optimised networks.
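The abstract names two mechanisms: an online estimate of each class mean and a per-epoch condensation step that re-anchors the means to the current representation. The NumPy sketch below illustrates how such a classifier could be structured; the decay factor `alpha`, the simplified `condense` step, and all shapes are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

class OnlineNCM:
    """Sketch of a Nearest Class Mean classifier with online means.

    The running means approximate the true class means while the deep
    representation drifts during training; condense() re-anchors them
    once per epoch (a simplified stand-in for "mean condensation").
    """

    def __init__(self, num_classes, dim, alpha=0.9):
        self.means = np.zeros((num_classes, dim))
        self.seen = np.zeros(num_classes, dtype=bool)
        self.alpha = alpha  # decay of the online estimate (assumed value)

    def update(self, features, labels):
        # Online update: move each class mean toward its batch mean,
        # so the estimates track the drifting representation.
        for c in np.unique(labels):
            batch_mean = features[labels == c].mean(axis=0)
            if self.seen[c]:
                self.means[c] = self.alpha * self.means[c] + (1 - self.alpha) * batch_mean
            else:
                self.means[c], self.seen[c] = batch_mean, True

    def condense(self, features, labels):
        # Per-epoch condensation: replace the running estimates with
        # exact class means under the current representation.
        for c in np.unique(labels):
            self.means[c] = features[labels == c].mean(axis=0)

    def predict(self, features):
        # Nearest-class-mean rule: assign each sample to the closest mean.
        dists = ((features[:, None, :] - self.means[None, :, :]) ** 2).sum(-1)
        return dists.argmin(axis=1)
```

In the full method the features would come from the network being trained, with `update` called per mini-batch and `condense` once per epoch; the toy class above only shows the mean bookkeeping.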


Related articles

From classifiers to discriminators: A nearest neighbor rule induced discriminant analysis

The design of current discriminant analysis methods is generally independent of classifiers, so the connection between discriminant analysis methods and classifiers is loose. This paper provides a way to design discriminant analysis methods that are bound to classifiers. We begin with a local mean based nearest neighbor (LM-NN) classifier and use its decision rule to supervise the design of a d...

Full text
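The LM-NN decision rule this entry builds on is standard: average the query's k nearest neighbours within each class and pick the class whose local mean is closest. A minimal sketch, with `k=3` as an arbitrary choice:

```python
import numpy as np

def lmnn_predict(X_train, y_train, x, k=3):
    """Local mean based nearest neighbor (LM-NN) decision rule."""
    best_class, best_dist = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        d = np.linalg.norm(Xc - x, axis=1)
        # Mean of the k training points of class c nearest to the query.
        local_mean = Xc[np.argsort(d)[:k]].mean(axis=0)
        dist = np.linalg.norm(local_mean - x)
        if dist < best_dist:
            best_class, best_dist = c, dist
    return best_class
```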

Deep Nearest Class Mean Model for Incremental Odor Classification

In recent years, more and more machine learning algorithms have been applied to odor recognition. These algorithms usually assume that the training dataset is static. However, for some odor recognition tasks the dataset grows dynamically: not only the training samples but also the number of classes increases over time. Motivated by this concern, we propose a dee...

Full text
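Class-incremental settings like this are where the nearest-class-mean idea pays off: a new class is added by estimating its mean, leaving existing classes untouched. A hypothetical sketch of that bookkeeping (not the paper's model, which also learns the representation):

```python
import numpy as np

class IncrementalNCM:
    """Per-class running means; new classes can arrive at any time."""

    def __init__(self):
        self.means = {}   # label -> (mean vector, sample count)

    def add_samples(self, label, features):
        # Exact incremental mean: combine the stored mean with the batch,
        # so no old samples need to be kept around.
        mean, n = self.means.get(label, (0.0, 0))
        total = n + len(features)
        self.means[label] = ((mean * n + features.sum(axis=0)) / total, total)

    def predict(self, x):
        labels = list(self.means)
        dists = [np.linalg.norm(self.means[c][0] - x) for c in labels]
        return labels[int(np.argmin(dists))]
```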

Symbolic Nearest Mean Classifiers

The minimum-distance classifier summarizes each class with a prototype and then uses a nearest neighbor approach for classification. Three drawbacks of the original minimum-distance classifier are its inability to work with symbolic attributes, to weight attributes, and to learn more than a single prototype per class. The proposed solutions to these problems include defining the mean for symbolic...

Full text
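One common way to give symbolic data a "mean" is a per-attribute value-frequency profile, with distance falling as the observed value gets more frequent in the class; whether this matches the paper's exact definition is an assumption. A sketch under that assumption, with hypothetical toy data:

```python
from collections import Counter

def symbolic_prototype(rows):
    """'Mean' of symbolic rows: per-attribute relative value frequencies."""
    protos = []
    for j in range(len(rows[0])):
        counts = Counter(row[j] for row in rows)
        total = sum(counts.values())
        protos.append({v: c / total for v, c in counts.items()})
    return protos

def distance(row, proto):
    # Values common in the class contribute little distance.
    return sum(1.0 - freqs.get(v, 0.0) for v, freqs in zip(row, proto))

# Hypothetical toy data: two symbolic attributes per row.
train = {"yes": [("sunny", "hot"), ("sunny", "mild")],
         "no":  [("rain", "cool")]}
protos = {c: symbolic_prototype(rows) for c, rows in train.items()}
print(min(protos, key=lambda c: distance(("sunny", "hot"), protos[c])))  # -> "yes"
```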

Lazy Classifiers Using P-trees

Lazy classifiers store all of the training samples and do not build a classifier until a new sample needs to be classified. This differs from eager classifiers, such as decision tree induction, which build a general model (such as a decision tree) before receiving new samples. K-nearest neighbor (KNN) classification is a typical lazy classifier. Given a set of training data, a k-nearest neighbor c...

Full text
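The lazy pattern is easiest to see in plain k-nearest neighbor, where "training" merely stores the data and all work is deferred to query time; the P-tree structure the paper uses to speed this up is not sketched here. A minimal sketch:

```python
import numpy as np
from collections import Counter

class LazyKNN:
    def __init__(self, k=5):
        self.k = k

    def fit(self, X, y):
        # Lazy: no model is built, the training set is merely stored.
        self.X, self.y = np.asarray(X), np.asarray(y)
        return self

    def predict_one(self, x):
        # All computation happens at classification time: find the k
        # nearest stored samples and take a majority vote.
        d = np.linalg.norm(self.X - x, axis=1)
        nearest = self.y[np.argsort(d)[:self.k]]
        return Counter(nearest.tolist()).most_common(1)[0][0]
```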

Feature scaling in support vector data description

When only samples of one class are easily accessible in a classification problem, it is called a one-class classification problem. Many standard classifiers, such as backpropagation neural networks, fail on this kind of data. Some other classifiers, such as k-means clustering or the nearest neighbor classifier, can be applied after minor changes. In this paper we focus on the support vector data de...

Full text
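The effect of feature scaling on a one-class boundary can be illustrated with scikit-learn's OneClassSVM, which with an RBF kernel is closely related to SVDD; the synthetic data and `nu=0.1` below are arbitrary choices, not the paper's setup.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# The second feature has a 50x larger spread and would dominate the
# RBF kernel distances if left unscaled.
X_train = rng.normal(loc=[0.0, 100.0], scale=[1.0, 50.0], size=(200, 2))

scaler = StandardScaler().fit(X_train)
clf = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale")
clf.fit(scaler.transform(X_train))

X_test = rng.normal(loc=[0.0, 100.0], scale=[1.0, 50.0], size=(10, 2))
print(clf.predict(scaler.transform(X_test)))  # +1 = inlier, -1 = outlier
```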



Publication date: 2018